Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
3.0 - 10.0 years
5 - 7 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Key Skills: Snowflake, SQL, AWS, DBT. Must-have skills: 3+ years of data engineering experience, including practical experience using Snowflake for data engineering tasks. Working knowledge of RESTful APIs, SQL, semi-structured datasets, and cloud-native concepts. Experience with ELT tools such as Fivetran, Qlik Replicate, and Matillion; experience with DBT is a must. Experience creating stored procedures and UDFs in Snowflake (see the sketch below). Experience with Snowpark for Python. Source data from data lakes, APIs, and on-premises systems. Transform, replicate, and share data across cloud platforms. Design end-to-end near-real-time streams. Design scalable compute solutions for data engineering workloads. Evaluate performance metrics.
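For illustration, a minimal sketch of what the "stored procedures, UDFs in Snowflake" and "Snowpark for Python" items involve: registering a Python UDF through Snowpark and calling it from SQL. The connection parameters, the PARSE_PRICE name, and the parse_price helper are hypothetical, not part of the posting.

```python
# Minimal Snowpark sketch: register a Python UDF and call it from SQL.
# Connection parameters and object names are hypothetical placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.types import FloatType, StringType

params = {
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "<wh>", "database": "<db>", "schema": "<schema>",
}
session = Session.builder.configs(params).create()

def parse_price(raw: str) -> float:
    # Normalize a semi-structured price field such as "1,299.50 INR".
    return float(raw.split()[0].replace(",", ""))

session.udf.register(
    func=parse_price,
    name="PARSE_PRICE",
    input_types=[StringType()],
    return_type=FloatType(),
    replace=True,
)

# The UDF is now callable from Snowflake SQL alongside native functions.
print(session.sql("SELECT PARSE_PRICE('1,299.50 INR')").collect())
```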
Posted 1 week ago
0.0 - 5.0 years
0 - 5 Lacs
Pune, Maharashtra, India
On-site
Role Overview: In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers). These centers are where we provide deep technical and industry expertise to a wide range of public and private sector clients globally. Our delivery centers offer clients locally based skills and technical expertise to drive innovation and the adoption of new technology. Your Role and Responsibilities: Provide expertise in analysis, requirements gathering, design, coordination, customization, testing, and support of reports in the client's environment. Develop and maintain a strong working relationship with business and technical members of the team. Maintain a relentless focus on quality and continuous improvement. Perform root cause analysis of report issues. Handle development and evolutionary maintenance of the environment, performance, capability, and availability. Assist in defining technical requirements and developing solutions. Ensure effective content and source-code management, troubleshooting, and debugging. Required Education: Bachelor's degree. Preferred Education: Master's degree. Required Technical and Professional Expertise: Tableau Desktop Specialist; strong understanding of SQL for querying databases. Good to have: Python, Snowflake, statistics, ETL experience. Extensive knowledge of creating impactful visualizations using Tableau. Must have a thorough understanding of SQL and advanced SQL (joins and relationships). Must have experience working with different databases and with blending data and creating relationships in Tableau. Must have extensive knowledge of writing Custom SQL to pull desired data from databases (an example follows below). Troubleshooting capabilities to debug data controls. Preferred Technical and Professional Experience: Capable of converting business requirements into workable models. Good communication skills and willingness to learn new technologies; team player, self-motivated, positive attitude.
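For illustration, a hedged sketch of the kind of Custom SQL pull the posting describes; it runs against an in-memory SQLite database purely so the example is self-contained, and all table and column names are hypothetical. In practice the query text would be pasted into a Tableau Custom SQL data source.

```python
# Self-contained sketch (SQLite so it runs anywhere): a Custom SQL join of
# the sort Tableau data sources use, validated outside Tableau first.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (id INTEGER, region TEXT);
    INSERT INTO orders VALUES (1, 10, 250.0), (2, 11, 120.0), (3, 10, 80.0);
    INSERT INTO customers VALUES (10, 'South'), (11, 'West');
""")

# A Custom SQL query joining fact and dimension tables, aggregated per region.
custom_sql = """
    SELECT c.region, COUNT(*) AS orders, SUM(o.amount) AS revenue
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.region
"""
for row in cur.execute(custom_sql):
    print(row)  # e.g. ('South', 2, 330.0)
```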
Posted 1 week ago
2.0 - 5.0 years
2 - 5 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
This position will play a key role on the First Line Risk and Control team, supporting Consumer Monitoring & Testing and driving the implementation of horizontal Consumer risk programs. This individual will be responsible for executing risk-based testing and liaising with product, operations, compliance, and legal teams to ensure regulatory adherence. The role will also provide the opportunity to drive development and enhancement of risk and control programs. Execute testing and monitoring of regulatory, policy, and process compliance. Gather and synthesize data to determine root causes and trends related to testing failures. Propose effective and efficient methods to enhance testing and sampling strategies (including automation; a sketch follows below) to ensure the most effective risk detection, analyses, and control solutions. Proactively identify potential business risks, process deficiencies, and improvement opportunities, and make recommendations for additional controls and corrective action to enhance the efficiency and effectiveness of risk mitigation processes. Maintain effective communication with stakeholders and support teams in remediation of testing errors; assist with implementation of corrective actions related to testing fails and non-compliance with policies and procedures. Identify continuous improvement opportunities to meet changing requirements, driving maximum visibility to the executive audience. Work closely with enterprise risk teams to ensure business line risks are being shared and rolled up to firm-wide risk summaries. Your Skills: 2-4 years of testing, audit, or compliance experience in consumer financial services. Bachelor's degree or equivalent military experience. Knowledge of applicable U.S. federal and state consumer lending laws and regulations as well as industry association standards, including, among others, the Truth in Lending Act (Reg Z), Equal Credit Opportunity Act (Reg B), Fair Credit Reporting Act (Reg V), and UDAAP. Understanding of test automation frameworks (data-driven, hybrid-driven, etc.). Knowledge of testing concepts, methodologies, and technologies. Genuine excitement and passion for leading root cause analysis, troubleshooting technical process failures, and implementing fixes to operationalize a process. Analytical, critical thinking, and problem-solving skills. Highly motivated self-starter with strong organizational skills, attention to detail, and the ability to remain organized in a fast-paced environment. Interpersonal and relationship management skills. Integrity, ethical standards, and sound judgment; ability to exercise discretion with respect to sensitive information. Ability to summarize observations and present them in a clear, concise manner to peers, managers, and senior Consumer Compliance management. Ability to quickly grasp complex concepts, including global business and regulatory matters. Confidence in expressing a point of view with management. Plus: CPA, audit experience, CRCM, and proficiency in Aqua Data Studio, Snowflake, Splunk, Excel macros, Tableau, and Hadoop/PySpark/Spark/Python/R.
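For illustration, a hedged sketch of risk-weighted sample selection for compliance testing, as the automation item above suggests: draw a sample biased toward higher-risk accounts and flag exceptions against a single control rule. The fields, the Reg Z-style control, and the sample size are hypothetical.

```python
# Hedged sketch of risk-based sample selection for compliance testing:
# sample accounts weighted by risk score, then flag control exceptions.
# All fields and thresholds are hypothetical.
import random

accounts = [
    {"id": i,
     "apr_disclosed": random.choice([True, True, True, False]),
     "risk_score": random.uniform(0.01, 1.0)}
    for i in range(1000)
]

# Weight selection by risk score so higher-risk accounts appear more often.
sample = random.choices(
    accounts, weights=[a["risk_score"] for a in accounts], k=50
)

# Control rule: Reg Z requires the APR to be disclosed; missing = test fail.
fails = [a["id"] for a in sample if not a["apr_disclosed"]]
fail_rate = len(fails) / len(sample)
print(f"tested={len(sample)} fails={len(fails)} fail_rate={fail_rate:.1%}")
```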
Posted 1 week ago
0.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Inviting applications for the role of Consultant - Sr. Data Engineer (DBT + Snowflake)! In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal. Job Description: Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities. Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy. Implement Conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through chatbot agents. Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows. Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance. Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis. Develop and maintain data documentation, best practices, and data governance protocols. Ensure data security, privacy, and compliance with organizational and regulatory guidelines. Responsibilities: Bachelor's degree in Computer Science, Data Engineering, or a related field. Experience in data engineering, including experience working with Snowflake. Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and Conversational AI. Strong proficiency in SQL, Python, and data modeling. Experience with data integration tools (e.g., Matillion, Talend, Informatica). Knowledge of cloud platforms such as AWS, Azure, or GCP. Excellent problem-solving skills, with a focus on data quality and performance optimization. Strong communication skills and the ability to work effectively in a cross-functional team. Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations. Understanding of data lineage and metadata management concepts, and ability to track and document data transformations using DBT's lineage capabilities. Understanding of software engineering best practices and ability to apply these principles to DBT development, including version control, code reviews, and automated testing. Should have experience building data ingestion pipelines.
Should have experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight. Should have good experience implementing CDC or SCD Type 2 (a sketch follows below). Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs. Good to have experience with repository tools like GitHub/GitLab or Azure Repos. Qualifications/Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skill sets. Skill Matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or other orchestration tools, data warehousing concepts. Why join Genpact? Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation. Make an impact - drive change for global enterprises and solve business challenges that matter. Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities. Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
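For illustration, a hedged two-step SCD Type 2 load in Snowflake, run from Python with snowflake-connector-python: a MERGE expires current rows whose tracked attribute changed, then an INSERT adds the new current versions (and brand-new keys). All connection parameters and table/column names are hypothetical.

```python
# Hedged sketch: two-step SCD Type 2 load, executed from Python.
# Connection parameters and table/column names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="<wh>", database="<db>", schema="<schema>",
)
cur = conn.cursor()

# Step 1: expire the current dimension row when a tracked attribute changed.
cur.execute("""
    MERGE INTO dim_customer d
    USING stg_customer s
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHEN MATCHED AND d.address <> s.address THEN UPDATE SET
      d.is_current = FALSE, d.valid_to = CURRENT_TIMESTAMP()
""")

# Step 2: insert a new current version for changed customers (which now
# have no current row) and for brand-new customers.
cur.execute("""
    INSERT INTO dim_customer (customer_id, address, valid_from, valid_to, is_current)
    SELECT s.customer_id, s.address, CURRENT_TIMESTAMP(), NULL, TRUE
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHERE d.customer_id IS NULL
""")
```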
Posted 1 week ago
0.0 years
0 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Snowflake + Python + Cloud)! In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal. Job Description: Experience in the IT industry. Working experience building productionized data ingestion and processing pipelines in Snowflake. Strong understanding of Snowflake architecture. Fully well-versed in data warehousing concepts. Expertise and excellent understanding of Snowflake features and of integrating Snowflake with other data processing systems. Able to create data pipelines for ETL/ELT. Excellent presentation and communication skills, both written and verbal. Ability to problem-solve and architect in an environment with unclear requirements. Able to create high-level and low-level design documents based on requirements. Hands-on experience in configuration, troubleshooting, testing, and managing data platforms, on premises or in the cloud. Awareness of data visualisation tools and methodologies. Work independently on business problems and generate meaningful insights. Good to have some experience/knowledge of Snowpark, Streamlit, or GenAI, but not mandatory. Should have experience implementing Snowflake best practices. Snowflake SnowPro Core Certification will be an added advantage. Roles and Responsibilities: Requirement gathering, creating design documents, providing solutions to customers, working with offshore teams, etc. Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data. Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit. Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems. Should have some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF). Should have good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage. Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
Knowledge of ETL (extract, transform, load) processes and tools, and ability to design and develop efficient ETL jobs using Python or PySpark. Should have some experience with Snowflake RBAC and data security. Should have good experience implementing CDC or SCD Type 2. Should have good experience implementing Snowflake best practices. In-depth understanding of data warehousing, ETL concepts, and data modelling. Experience in requirement gathering, analysis, design, development, and deployment. Should have experience building data ingestion pipelines (a sketch follows below). Optimize and tune data pipelines for performance and scalability. Able to communicate with clients and lead a team. Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs. Good to have experience with deployment using CI/CD tools and with repositories like Azure Repos, GitHub, etc. Qualifications we seek in you! Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and relevant experience as a Snowflake Data Engineer. Skill Matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and data warehousing concepts. Why join Genpact? Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation. Make an impact - drive change for global enterprises and solve business challenges that matter. Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities. Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
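For illustration, a hedged sketch of a batch ingestion pipeline from an S3 external stage into Snowflake using COPY INTO, the pattern Snowpipe automates for continuous loads. The stage, bucket, credentials, and table names are hypothetical.

```python
# Hedged sketch: load files landed in S3 into Snowflake with COPY INTO.
# Stage, bucket, credentials, and table names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="<wh>", database="<db>", schema="<schema>",
)
cur = conn.cursor()

cur.execute("""
    CREATE STAGE IF NOT EXISTS raw_stage
    URL = 's3://<bucket>/landing/'
    CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>')
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

cur.execute("""
    COPY INTO raw_orders
    FROM @raw_stage
    PATTERN = '.*orders.*[.]csv'
    ON_ERROR = 'CONTINUE'
""")

# COPY INTO returns one row per file with its load status and row counts.
for row in cur.fetchall():
    print(row[0], row[1])
```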
Posted 1 week ago
0.0 years
0 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Ready to build the future with AI? At Genpact, we don't just keep up with technology, we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Inviting applications for the role of Consultant - PySpark/Python Data Engineer! We are looking for a passionate Python developer to join our team at Genpact. You will be responsible for developing and implementing high-quality software solutions for data transformation and analytics using cutting-edge programming features and frameworks, and collaborating with other teams in the firm to define, design, and ship new features. As an active part of our company, you will brainstorm and chalk out solutions to suit our requirements and meet our business goals. You will also be working on data engineering problems and building data pipelines. You will get ample opportunities to work on challenging and innovative projects using the latest technologies and tools. If you enjoy working in a fast-paced and collaborative environment, we encourage you to apply for this exciting role. We offer industry-standard compensation packages, relocation assistance, and professional growth and development opportunities. Responsibilities: Develop, test, and maintain high-quality solutions using PySpark/Python. Participate in the entire software development lifecycle, building, testing, and delivering high-quality data pipelines. Collaborate with cross-functional teams to identify and solve complex problems. Write clean and reusable code that can be easily maintained and scaled. Keep up to date with emerging trends and technologies in Python development. Qualifications we seek in you! Minimum qualifications: Experience as a Python developer with a strong portfolio of projects. Bachelor's degree in Computer Science, Software Engineering, or a related field. Experience developing pipelines on cloud platforms such as AWS or Azure using AWS Glue or ADF. In-depth understanding of the Python software development stack, ecosystem, frameworks, and tools such as NumPy, SciPy, Pandas, Dask, spaCy, NLTK, Great Expectations, Splink, and PyTorch. Experience with data platforms such as Databricks or Snowflake. Experience with front-end development using HTML or Python. Familiarity with database technologies such as SQL and NoSQL. Excellent problem-solving ability with solid communication and collaboration skills. Preferred skills and qualifications: Experience with popular Python frameworks such as Django, Flask, FastAPI, or Pyramid.
Knowledge of GenAI concepts and LLMs. Contributions to open-source Python projects or active involvement in the Python community. Why join Genpact? Lead AI-first transformation - build and scale AI solutions that redefine industries. Make an impact - drive change for global enterprises and solve business challenges that matter. Accelerate your career - gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills. Grow with the best - learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace. Committed to ethical AI - work in an environment where governance, transparency, and security are at the core of everything we build. Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
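For illustration, a minimal PySpark sketch of the pipeline work this posting describes: read raw files, deduplicate and clean, and write partitioned Parquet. The paths and column names are hypothetical.

```python
# Minimal PySpark sketch: read raw data, clean and enrich it, write
# partitioned output. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

raw = spark.read.option("header", True).csv("s3://<bucket>/raw/orders/")

cleaned = (
    raw.dropDuplicates(["order_id"])                     # remove replayed rows
       .filter(F.col("amount").isNotNull())              # drop unusable records
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))  # enrichment column
)

cleaned.write.mode("overwrite").partitionBy("order_date") \
       .parquet("s3://<bucket>/curated/orders/")
```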
Posted 1 week ago
1.0 - 4.0 years
1 - 4 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
The Analyst/Associate role (1-4 years of experience) involves collaborating dynamically across business and engineering teams, translating business problems into detailed data specifications, and then designing, building, and deploying scalable relational data models which serve as the source for business users'/consumers' analytical use cases. The role requires end-to-end skills in data engineering, ETL, data modeling, distributed databases, and math/logic, plus a good grasp of SDLC best practices and data governance. In the course of building this data solution, the engineer will benefit from, and be required to learn, financial data engineering as it is performed at a top-tier financial firm. Skills & Experience We Are Looking For: Academic qualifications: a Bachelor's or Master's degree in a computational field (Computer Science, Applied Mathematics, Engineering, or a related quantitative discipline). 1-4 years of relevant work experience in a global, team-oriented environment. Strong object-oriented design and hands-on experience in one or more programming languages (such as Java, Python, C++) using object-oriented design techniques and best practices. Deep understanding of the multidimensionality of data, data curation, and data quality, such as traceability, security, performance latency, and correctness across supply and demand processes. In-depth knowledge of relational and columnar SQL databases, including database design. Expertise in data warehousing concepts (e.g., star schema, entitlement implementations, SQL modeling, milestoning, indexing, partitioning; a star-schema sketch follows below). Excellent communication skills and the ability to work with subject matter experts to extract critical business concepts and gather business requirements. Independent thinker, willing to engage, challenge, or learn. Ability to stay commercially focused and to always push for quantifiable commercial impact. Strong work ethic, a sense of ownership, and urgency. Strong analytical and problem-solving skills. Ability to collaborate effectively across global teams and communicate complex ideas in a simple manner. Preferred Qualifications: Industry experience in data engineering. Exposure to cloud databases (such as Snowflake, SingleStore). Exposure to cloud infrastructure (AWS, Azure, or GCP) and infrastructure as code (Terraform). Experience with programming for extract-transform-load (ETL) operations and data analysis.
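For illustration, a hedged sketch of the star-schema concept listed above: one fact table keyed to dimension tables, with an index on the fact's date key. It is executed against SQLite only so the sketch runs anywhere; partitioning and milestoning columns would be added in a real warehouse, and all names are hypothetical.

```python
# Illustrative star-schema DDL: a fact table referencing dimensions.
# Run against sqlite3 purely so the sketch is self-contained.
import sqlite3

ddl = """
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, calendar_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    units       INTEGER,
    revenue     REAL
);
-- A typical fact-table index; a warehouse would also partition on date_key.
CREATE INDEX ix_fact_sales_date ON fact_sales(date_key);
"""
conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
print("star schema created")
```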
Posted 1 week ago
2.0 - 4.0 years
2 - 4 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Role Overview: The Site Reliability Engineering team is responsible for the design, implementation, and end-to-end ownership of the infrastructure platform and services that protect Trellix's consumer security customers. The services provide continuous protection to our customers, with a very strong focus on quality and an extensible services platform for internal partners and product teams. This role is a Site Reliability Engineer for commercial cloud-native solutions, deployed and managed in public cloud environments like AWS and GCP. You will be part of a team that is responsible for Trellix Cloud Services that enable protection in endpoint products on a continuous basis. Responsibilities of this role include supporting cloud service measurement, monitoring, and reporting, deployments, and security. You will contribute to improving overall operational quality through common practices and by working with the Engineering, QA, and product DevOps teams. You will also be responsible for supporting efforts that improve operational excellence and availability of Trellix production environments. You will have access to the latest tools and technology, and an incredible career path with the world's cybersecurity leader. You will have the opportunity to immerse yourself within complex and demanding deployment architectures and see the big picture, all while helping to drive continuous improvement in all aspects of a dynamic and high-performing engineering organization. If you are passionate about running and continuously improving a world-class Site Reliability Engineering team, we are offering you a unique opportunity to build your career with us and gain experience working with high-performance cloud systems. About the Role: Being part of a global 24x7x365 team providing operational coverage, including event response and recovery efforts for critical services. Periodic deployment of features, patches, and hotfixes to maintain the security posture of our cloud services. Ability to work in shifts on a rotational basis and participate in on-call duties. Ownership of and responsibility for high availability of production environments. Input into the monitoring of systems, applications, and supporting data. Report on system uptime and availability. Collaborate with other team members on best practices. Assist with creating and updating runbooks and SOPs. Build a strong relationship with the Cloud DevOps, Dev, and QA teams and become a domain expert for the cloud services in your remit. You will be provided the required support for growth and development in this role. About you: 2 to 4 years of hands-on working experience supporting production of large-scale cloud services. Strong production support background and experience with in-depth troubleshooting. Experience working with solutions in both Linux and Windows environments. Experience using modern monitoring and alerting tools (Prometheus, Grafana, PagerDuty, etc.; a probe sketch follows after this posting). Excellent written and verbal communication skills. Experience with Python or other scripting languages. Proven ability to work independently in deploying, testing, and troubleshooting systems. Experience supporting high-availability systems and scalable solutions hosted on AWS or GCP. Familiarity with security tools and practices (Wiz, Tenable). Familiarity with containerization and associated management tools (Docker, Kubernetes). Significant experience developing and maintaining relationships with a wide range of customers at all levels. Understanding of Incident, Change, Problem, and Vulnerability Management processes.
Desired: Awareness of ITIL best practices. AWS Certification and/or Kubernetes Certification. Experience with Snowflake. Automation/CI/CD experience: Jenkins, Ansible, GitHub Actions, Argo CD. Company Benefits and Perks: We believe that the best solutions are developed by teams who embrace each other's unique experiences, skills, and abilities. We work hard to create a dynamic workforce where we encourage everyone to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours, and family-friendly benefits to all of our employees. Retirement Plans. Medical, Dental and Vision Coverage. Paid Time Off. Paid Parental Leave. Support for Community Involvement. We're serious about our commitment to a workplace where everyone can thrive and contribute to our industry-leading products and customer support, which is why we prohibit discrimination and harassment based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation, or any other legally protected status.
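For illustration, a hedged sketch of the monitoring work described in this posting: a small Python probe that exposes a service-health gauge for Prometheus to scrape, which Grafana or PagerDuty could then alert on. The endpoint URL, port, and interval are hypothetical.

```python
# Hedged monitoring sketch: expose a health gauge for Prometheus to scrape.
# Endpoint URL, port, and probe interval are hypothetical.
import time
import urllib.request
from prometheus_client import Gauge, start_http_server

up = Gauge("service_up", "1 if the health endpoint responds, else 0")

def probe(url: str) -> int:
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return 1 if resp.status == 200 else 0
    except OSError:
        return 0

if __name__ == "__main__":
    start_http_server(9100)  # Prometheus scrapes metrics from :9100
    while True:
        up.set(probe("https://svc.internal/healthz"))
        time.sleep(30)
```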
Posted 1 week ago
0.0 - 4.0 years
0 - 5 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
As an Engineer in the Risk Engineering organization, you will have the opportunity to impact one or more aspects of risk management. You will work with a team of talented engineers to drive the build and adoption of common tools, platforms, and applications. The team builds solutions that are offered as a software product or as a hosted service. We are a dynamic team of talented developers and architects who partner with business areas and other technology teams to deliver high-profile projects using a raft of technologies that are fit for purpose (Java, Python, Snowflake, cloud computing (AWS), S3, Spark, ReactJS, among many others). A glimpse of the interesting problems that we engineer solutions for includes acquiring high-quality data, storing it, performing risk computations in a limited amount of time using distributed computing, and making data available to enable actionable risk insights through analytical and response user interfaces. Basic Qualifications: Bachelor's degree in Computer Science, Mathematics, Electrical Engineering, or a related technical discipline, with 6 months to 2 years of working experience. Willingness to learn new things and work with a diverse group of people across multiple geographical locations. An eagerness to grow as a software engineer. Proficiency in Java, Python, or another object-oriented programming language. A clear understanding of data structures, algorithms, software design, and core programming concepts. Preferred Qualifications: Experience with one or more major relational/object databases, including cloud databases like Snowflake. Interact with business users to resolve issues with applications. Performance-tune applications to improve memory and CPU utilization.
Posted 1 week ago
1.0 - 2.0 years
1 - 2 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
As an Engineer in the Risk Engineering organization, you will have the opportunity to impact one or more aspects of risk management. You will work with a team of talented engineers to drive the build and adoption of common tools, platforms, and applications. The team builds solutions that are offered as a software product or as a hosted service. We are a dynamic team of talented developers and architects who partner with business areas and other technology teams to deliver high-profile projects using a raft of technologies that are fit for purpose (Java, cloud computing, HDFS, Spark, S3, ReactJS, Sybase IQ, among many others). A glimpse of the interesting problems that we engineer solutions for includes acquiring high-quality data, storing it, performing risk computations in a limited amount of time using distributed computing, and making data available to enable actionable risk insights through analytical and response user interfaces. WHAT WE LOOK FOR: A senior developer on large projects across a global team of developers and risk managers. Performance-tune applications to improve memory and CPU utilization. Perform statistical analyses to identify trends and exceptions related to Market Risk metrics. Build internal and external reporting for the output of risk metric calculations using data extraction tools, such as SQL, and data visualization tools, such as Tableau. Utilize web development technologies to facilitate application development for the front-end UI used for risk management actions. Develop software for calculations using databases like Snowflake, Sybase IQ, and distributed HDFS systems. Interact with business users to resolve issues with applications. Design and support batch processes using scheduling infrastructure for calculating and distributing data to other systems (a DAG sketch follows below). Oversee junior technical team members in all aspects of the Software Development Life Cycle (SDLC), including design, code review, and production migrations. Skills And Experience: Bachelor's degree in Computer Science, Mathematics, Electrical Engineering, or a related technical discipline. 1-2 years of experience working in a risk technology team at another bank or financial institution. Experience in market risk technology is a plus. Experience with one or more major relational/object databases. Experience in software development, including a clear understanding of data structures, algorithms, software design, and core programming concepts. Comfortable multi-tasking, managing multiple stakeholders, and working as part of a team. Comfortable working with multiple languages. Technologies: Scala, Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant). Experience working with process scheduling platforms like Apache Airflow. Should be ready to work with GS proprietary technology like Slang/SecDB. An understanding of compute resources and the ability to interpret performance metrics (e.g., CPU, memory, threads, file handles). Knowledge and experience in distributed computing: parallel computation on a single machine with tools like Dask, and distributed processing on public cloud. Knowledge of SDLC and experience working through the entire life cycle of a project from start to end.
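For illustration, a hedged sketch of the scheduled batch processes mentioned above, written as an Airflow 2.x DAG: compute risk metrics nightly, then distribute the output. The task callables and schedule are hypothetical placeholders.

```python
# Hedged Airflow sketch: nightly risk-metrics batch with two chained tasks.
# Task bodies and schedule are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def compute_metrics():
    print("computing market risk metrics...")

def distribute_results():
    print("publishing results to downstream systems...")

with DAG(
    dag_id="risk_metrics_batch",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # nightly batch at 02:00
    catchup=False,
) as dag:
    compute = PythonOperator(task_id="compute", python_callable=compute_metrics)
    publish = PythonOperator(task_id="publish", python_callable=distribute_results)
    compute >> publish  # publish runs only after compute succeeds
```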
Posted 1 week ago
0.0 - 3.0 years
0 - 3 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
To ensure uncompromising accuracy and timeliness in the delivery of risk metrics, our platform is continuously growing and evolving. Credit Risk Engineering combines the principles of computer science, mathematics, and finance to produce large-scale, computationally intensive calculations of the risk Goldman Sachs faces with each transaction we engage in. Credit Risk Engineering has an opportunity for an Analyst-level Software Engineer to work across a broad range of applications and an extremely diverse set of technologies to keep the suite operating at peak efficiency. As an Engineer in the Risk Engineering organization, you will have the opportunity to impact one or more aspects of risk management. You will work with a team of talented engineers to drive the build and adoption of common tools, platforms, and applications. The team builds solutions that are offered as a software product or as a hosted service. We are a dynamic team of talented developers and architects who partner with business areas and other technology teams to deliver high-profile projects using a raft of technologies that are fit for purpose (Java, Python, Snowflake, cloud computing (AWS), S3, Spark, ReactJS, among many others). A glimpse of the interesting problems that we engineer solutions for includes acquiring high-quality data, storing it, performing risk computations in a limited amount of time using distributed computing, and making data available to enable actionable risk insights through analytical and response user interfaces. Basic Qualifications: Bachelor's degree in Computer Science, Mathematics, Electrical Engineering, or a related technical discipline, with 6 months to 2 years of working experience. Willingness to learn new things and work with a diverse group of people across multiple geographical locations. An eagerness to grow as a software engineer. Proficiency in Java, Python, or another object-oriented programming language. A clear understanding of data structures, algorithms, software design, and core programming concepts. Preferred Qualifications: Experience with one or more major relational/object databases, including cloud databases like Snowflake. Interact with business users to resolve issues with applications. Performance-tune applications to improve memory and CPU utilization.
Posted 1 week ago
3.0 - 8.0 years
16 - 17 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
Role - AI/ML Engineer. Location - Bangalore. We are seeking a highly skilled and innovative AI Data Engineer to join our development team. In this role, you will design, develop, and deploy AI systems that can generate content, reason autonomously, and act as intelligent agents in dynamic environments. Key Responsibilities: Design and implement generative AI models (e.g., LLMs, diffusion models) for text, image, audio, or multimodal content generation. Develop agentic AI systems capable of autonomous decision-making, planning, and tool use in complex environments. Integrate AI agents with APIs, databases, and external tools to enable real-world task execution. Fine-tune foundation models for domain-specific applications using techniques like RLHF, prompt engineering, and retrieval-augmented generation (RAG). Collaborate with cross-functional teams including product, design, and engineering to bring AI-powered features to production. Conduct research and stay up to date with the latest advancements in generative and agentic AI. Ensure ethical, safe, and responsible AI development practices. Required Qualifications: Bachelor's or Master's degree in Computer Science, AI, Machine Learning, or a related field. 3+ years of experience in machine learning, with a focus on generative models or autonomous agents. Proficiency in Python and ML frameworks such as PyTorch. Experience with LLMs (e.g., GPT, Claude, LLaMA, Cortex), transformers, and diffusion models. Familiarity with agent frameworks (e.g., LangChain, AutoGPT, ReAct, OpenAgents). Experience with AWS and Snowflake services. Prior healthcare experience. Strong understanding of reinforcement learning, planning algorithms, and multi-agent systems. Excellent problem-solving and communication skills.
Posted 1 week ago
2.0 - 5.0 years
2 - 5 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Senior Snowflake Database Engineer who excels in developing complex queries and stored procedures. The ideal candidate should have a deep understanding of Snowflake architecture and performance tuning techniques. They will work closely with application engineers to integrate database solutions seamlessly into applications, ensuring optimal performance and reliability. Key Skills and Responsibilities: Strong expertise in Snowflake, including data modeling, query optimization, and performance tuning. Proficiency in writing complex SQL queries, stored procedures, and functions. Experience with database performance tuning techniques, including indexing and query profiling (a profiling sketch follows below). Familiarity with integrating database solutions into application code and workflows. Knowledge of data governance and data quality best practices is a plus. Strong analytical and problem-solving skills, along with excellent communication skills to collaborate effectively.
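For illustration, a hedged sketch of query profiling from Python: run EXPLAIN on a candidate query through snowflake-connector-python and inspect the plan rows before tuning. The connection parameters and the query itself are hypothetical.

```python
# Hedged sketch: inspect a Snowflake query plan with EXPLAIN before tuning.
# Connection parameters and the query are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="<wh>", database="<db>", schema="<schema>",
)
cur = conn.cursor()

query = """
    SELECT c.region, SUM(o.amount) AS revenue
    FROM orders o JOIN customers c ON c.id = o.customer_id
    GROUP BY c.region
"""
cur.execute("EXPLAIN " + query)
for row in cur.fetchall():
    print(row)  # one row per plan operator: step, operation, objects, pruning
```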
Posted 1 week ago
7.0 - 12.0 years
7 - 12 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Job Summary: We are looking for a highly skilled Data Engineer with hands-on experience in Snowflake, Python, DBT, and modern data architecture. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and data warehouse solutions that support analytics and business intelligence initiatives. Key Responsibilities: Design and implement scalable data pipelines using ETL/ELT frameworks. Develop and maintain data models and data warehouse architecture using Snowflake. Build and manage DBT (Data Build Tool) models for data transformation and lineage tracking. Write efficient and reusable Python scripts for data ingestion, transformation, and automation. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements. Ensure data quality, integrity, and governance across all data platforms. Monitor and optimize performance of data pipelines and queries. Implement best practices for data engineering, including version control, testing, and CI/CD. Required Skills and Qualifications: 8+ years of experience in data engineering or a related field. Strong expertise in Snowflake including schema design, performance tuning, and security. Proficiency in Python for data manipulation and automation. Solid understanding of data modeling concepts (star/snowflake schema, normalization, etc.). Experience with DBT for data transformation and documentation. Hands-on experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, Prefect). Strong SQL skills and experience with large-scale data sets. Familiarity with cloud platforms (AWS, Azure, or GCP) and data services.
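For illustration, and noting that dbt also supports Python models on Snowflake (alongside the usual SQL models), a hedged sketch of a dbt Python model that transforms an upstream staging model; the stg_orders ref and column names are hypothetical.

```python
# Hedged sketch of a dbt Python model (e.g. models/orders_enriched.py) in a
# dbt-snowflake project. dbt calls model() with a context object and a
# Snowpark session, and materializes whatever dataframe it returns; the
# dbt.ref() call keeps the transformation lineage-tracked.
def model(dbt, session):
    dbt.config(materialized="table")

    orders = dbt.ref("stg_orders")  # lineage-tracked upstream model

    # Snowpark dataframe transformation: add a simple order-size band.
    from snowflake.snowpark.functions import col, when

    return orders.with_column(
        "size_band",
        when(col("AMOUNT") >= 1000, "large").otherwise("small"),
    )
```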
Posted 1 week ago
8.0 - 13.0 years
8 - 13 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
8+ years of experience in data engineering or a related field. Strong expertise in Snowflake including schema design, performance tuning, and security. Proficiency in Python for data manipulation and automation. Solid understanding of data modeling concepts (star/snowflake schema, normalization, etc.). Experience with DBT for data transformation and documentation. Hands-on experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, Prefect). Strong SQL skills and experience with large-scale data sets. Familiarity with cloud platforms (AWS, Azure, or GCP) and data services.
Posted 1 week ago
8.0 - 13.0 years
8 - 13 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Proven experience as a Data Modeler or in a similar role (around 8 years, depending on seniority level). Proficiency in data modeling tools (e.g., ER/Studio, Erwin, SAP PowerDesigner, or similar). Strong understanding of database technologies (e.g., SQL Server, Oracle, PostgreSQL, Snowflake). Experience with cloud data platforms (e.g., AWS, Azure, GCP). Familiarity with ETL processes and tools. Excellent knowledge of normalization and denormalization techniques. Strong analytical and problem-solving skills.
Posted 1 week ago
6.0 - 9.0 years
15 - 22 Lacs
Pune, Maharashtra, India
Remote
Data Engineer - DBT, Snowflake, Looker. Location: Remote. Experience: 7-10 years. About the Role: We are looking for an experienced Data Engineer to design and build scalable data pipelines and enable powerful business insights. You'll work with modern data stack tools like DBT, Snowflake, and Looker to empower data-driven decisions. Key Responsibilities: Design and maintain scalable data pipelines (DBT, Snowflake). Perform data transformation, cleansing, and enrichment. Integrate data from multiple sources into a data warehouse/data lake. Support reporting, analytics, and BI with Looker or similar tools. Optimize performance and troubleshoot data workflows. Document processes and ensure data quality. Skills Required: DBT, Snowflake, Looker (or similar tools). Strong SQL, Python (or similar scripting). Data modeling, schema design, database optimization. Problem-solving and business requirement translation. Excellent communication and cross-functional collaboration.
Posted 1 week ago
6.0 - 9.0 years
15 - 22 Lacs
Bengaluru / Bangalore, Karnataka, India
Remote
Data Engineer - DBT, Snowflake, Looker. Location: Remote. Experience: 7-10 years. About the Role: We are looking for an experienced Data Engineer to design and build scalable data pipelines and enable powerful business insights. You'll work with modern data stack tools like DBT, Snowflake, and Looker to empower data-driven decisions. Key Responsibilities: Design and maintain scalable data pipelines (DBT, Snowflake). Perform data transformation, cleansing, and enrichment. Integrate data from multiple sources into a data warehouse/data lake. Support reporting, analytics, and BI with Looker or similar tools. Optimize performance and troubleshoot data workflows. Document processes and ensure data quality. Skills Required: DBT, Snowflake, Looker (or similar tools). Strong SQL, Python (or similar scripting). Data modeling, schema design, database optimization. Problem-solving and business requirement translation. Excellent communication and cross-functional collaboration.
Posted 1 week ago
6.0 - 9.0 years
15 - 22 Lacs
Chennai, Tamil Nadu, India
Remote
Data Engineer - DBT, Snowflake, Looker. Location: Remote. Experience: 7-10 years. About the Role: We are looking for an experienced Data Engineer to design and build scalable data pipelines and enable powerful business insights. You'll work with modern data stack tools like DBT, Snowflake, and Looker to empower data-driven decisions. Key Responsibilities: Design and maintain scalable data pipelines (DBT, Snowflake). Perform data transformation, cleansing, and enrichment. Integrate data from multiple sources into a data warehouse/data lake. Support reporting, analytics, and BI with Looker or similar tools. Optimize performance and troubleshoot data workflows. Document processes and ensure data quality. Skills Required: DBT, Snowflake, Looker (or similar tools). Strong SQL, Python (or similar scripting). Data modeling, schema design, database optimization. Problem-solving and business requirement translation. Excellent communication and cross-functional collaboration.
Posted 1 week ago
14.0 - 24.0 years
14 - 24 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Cloud Data Architect: a Cloud Architect with experience in Azure and Snowflake, along with experience in RFP and proposal writing. Responsible for designing and implementing secure, scalable, and highly available cloud-based solutions, and for estimation, on AWS and Azure Cloud. Experience in Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake services. Participate in pre-sales activities, including RFP and proposal writing. Experience with integration of different data sources with data warehouses and data lakes is required. Experience in creating data warehouses and data lakes for reporting, AI, and machine learning. Understanding of data modelling and data architecture concepts. Participate in proposal and capability presentations. Able to clearly articulate the pros and cons of various technologies and platforms. Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms. Define and implement cloud governance and best practices. Identify and implement automation opportunities to increase operational efficiency. Conduct knowledge-sharing and training sessions to educate clients and internal teams on cloud technologies.
Posted 2 weeks ago
10.0 - 14.0 years
10 - 14 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
What you will do: In this vital role you will be responsible for designing, developing, and maintaining software solutions for research scientists, as well as automating operations, monitoring system health, and responding to incidents to minimize downtime. You will join a multi-functional team of scientists and software professionals that enables technology and data capabilities to evaluate drug candidates and assess their abilities to affect the biology of drug targets. This team implements scientific software platforms that enable the capture, analysis, storage, and reporting of in vitro assays and in vivo / pre-clinical studies, as well as platforms that manage compound inventories and biological sample banks. The ideal candidate possesses experience in the pharmaceutical or biotech industry, strong technical skills, and full-stack software engineering experience (spanning SQL, back-end, front-end web technologies, and automated testing). Roles & Responsibilities: Design, develop, and implement applications and modules, including custom reports, interfaces, and enhancements. Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications. Develop and implement unit tests, integration tests, and other testing strategies to ensure the quality of the software. Identify and resolve software bugs and performance issues. Work closely with multi-functional teams, including product management, design, and QA, to deliver high-quality software on time. Maintain detailed documentation of software designs, code, and development processes. Customize modules to meet specific business requirements. Work on integrating with other systems and platforms to ensure seamless data flow and functionality. Provide ongoing support and maintenance for applications, ensuring that they operate smoothly and efficiently. Possess strong rapid-prototyping skills and the ability to quickly translate concepts into working code. Contribute to both front-end and back-end development using cloud technology. Develop innovative solutions using generative AI technologies. Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations. Identify and resolve technical challenges effectively. Stay updated with the latest trends and advancements. Work closely with the product team, the business team (including scientists), and other collaborators. Additional Responsibilities: Project & Portfolio Delivery: Lead the execution of initiatives across the data platforms portfolio, ensuring projects are delivered on time, within scope, and to expected quality standards. Coordinate cross-functional teams (business, engineering, architecture, operations, governance) to deliver tools, technologies, and platforms. Lead initiatives for evaluating the latest market technologies in the areas of data engineering, management, and governance. Financial Management: Own and manage project and portfolio budgets, including tracking actuals vs. forecasts and accruals, and reporting on financial performance to stakeholders. Partner with Finance, Procurement, and Vendor Management teams to support contract reviews and platform costs. Proactively monitor financial risks and ensure alignment of project spend with approved business cases and funding models. Prepare financial summaries and variance reports for leadership and program steering committees.
Planning & Governance Maintain integrated plans and roadmaps across projects within the data platforms portfolio. Run governance forums, manage stakeholder expectations, and ensure project artifacts, status reports, and RAID logs are consistently maintained. Stakeholder & Communication Management Serve as the central point of contact between technical teams, business stakeholders, and vendors. Lead project steering committee meetings and provide clear and concise updates to senior leadership. Agile & Hybrid Delivery Apply agile, SAFe or hybrid delivery methods based on project needs, support backlog grooming, sprint planning, and release planning. Promote continuous improvement in delivery through retrospectives and feedback loops. Must Have skills: Demonstrated experience managing project financials (budgeting, forecasting, variance analysis, cost optimization) Experience working in large, complex enterprise environments with cross-functional stakeholders Familiarity with modern data platforms such as Azure Data Lake, Databricks, Snowflake, Synapse, Kafka, Delta Lake, etc. Strong understanding of data management lifecycle, data architecture, and platform components (ingestion, processing, governance, access) Excellent interpersonal, presentation, and negotiation skills PMP, PMI-ACP, SAFe, or equivalent certifications are a plus Basic Qualifications and Experience: Masters degree with 8-10+ years of experience in Business, Engineering, IT or related field OR Bachelors degree with 10-14+ years of experience in Business, Engineering, IT or related field OR Diploma with 14+ years of experience in Business, Engineering, IT or related field Good-to-Have Skills: Strong understanding of Cloud Infrastructure, Data & Analytics tools like Databricks, Informatica, PowerBI, Tableau and Data Governance technologies Experience with cloud (e.g. AWS) and on-premises compute infrastructure Experience with Databricks platform. Professional Certifications : Project Managerment Certifications Agile Certified Practitioner (preferred) AWS certification Soft Skills: Excellent interpersonal, presentation, and negotiation skills Strong analytical abilities to assess and improve data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams
Posted 2 weeks ago
5.0 - 10.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Req ID: 325686
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking an AWS, SQL, Snowflake, ControlM, ServiceNow - Operational Engineer (Weekend on-call) to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Minimum Experience on Key Skills - 5 to 10 years
Skills: AWS, SQL, Snowflake, ControlM, ServiceNow - Operational Engineer (Weekend on-call)
We are looking for an operational engineer who is ready to work weekend on-call as the primary criterion. Skills we look for: AWS cloud (SQS, SNS, DynamoDB, EKS), SQL (PostgreSQL, Cassandra), Snowflake, ControlM/Autosys/Airflow, ServiceNow, Datadog, Splunk, Grafana, Python/shell scripting.
About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.
NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications.
NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
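Purely as an illustration of the scripting this on-call role calls for (not part of the posting), the sketch below polls an SQS queue's backlog with boto3 and flags it for attention; the queue URL and alert threshold are hypothetical placeholders.

```python
# Illustrative only: a minimal on-call health check that reads an SQS queue's
# approximate backlog. The queue URL and threshold are hypothetical; AWS
# credentials and region are assumed to come from the environment.
import boto3

QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/example-queue"  # hypothetical
BACKLOG_THRESHOLD = 1000  # hypothetical alert threshold

def check_queue_backlog() -> None:
    sqs = boto3.client("sqs")
    attrs = sqs.get_queue_attributes(
        QueueUrl=QUEUE_URL,
        AttributeNames=["ApproximateNumberOfMessages"],
    )
    backlog = int(attrs["Attributes"]["ApproximateNumberOfMessages"])
    if backlog > BACKLOG_THRESHOLD:
        # In a real setup this would page via Datadog/ServiceNow rather than print.
        print(f"ALERT: backlog of {backlog} messages exceeds {BACKLOG_THRESHOLD}")
    else:
        print(f"OK: {backlog} messages in queue")

if __name__ == "__main__":
    check_queue_backlog()
```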
Posted 2 weeks ago
0.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking a Software Development Advisor to join our team in Bengaluru, Karnataka (IN-KA), India (IN).
Designing and implementing highly performant data ingestion pipelines from multiple sources using Apache Spark and/or Azure Databricks.
Developing scalable and re-usable frameworks for ingesting geospatial data sets.
Mandatory Skills - Tech stack: Azure Databricks, ADF, Azure Synapse, PySpark
Nice to have: Snowflake
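As a purely illustrative sketch of the ingestion work this posting describes (not code from the employer), the minimal PySpark job below reads a hypothetical landing folder and appends it to a Delta table; all paths and column names are placeholders, and Delta support is assumed to be available as it is on Databricks.

```python
# Illustrative only: a minimal batch ingestion step. Paths, mount points, and
# column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("geo-ingest-example").getOrCreate()

# Read raw source files from a hypothetical landing path, inferring the schema.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/landing/geospatial/")  # hypothetical mount point
)

# Light standardization before persisting: drop duplicates and rows missing
# coordinates (hypothetical geospatial columns).
cleaned = raw.dropDuplicates().na.drop(subset=["latitude", "longitude"])

# Append into a Delta table (hypothetical curated path) for downstream use.
cleaned.write.format("delta").mode("append").save("/mnt/curated/geospatial/")
```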
Posted 2 weeks ago
10.0 - 12.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Req ID: 323226
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking a Digital Solution Architect Sr. Advisor to join our team in Bengaluru, Karnataka (IN-KA), India (IN).
Key Responsibilities:
Design data platform architectures (data lakes, lakehouses, DWH) using modern cloud-native tools (e.g., Databricks, Snowflake, BigQuery, Synapse, Redshift).
Architect data ingestion, transformation, and consumption pipelines using batch and streaming methods.
Enable real-time analytics and machine learning through scalable and modular data frameworks.
Define data governance models, metadata management, lineage tracking, and access controls.
Collaborate with AI/ML, application, and business teams to identify high-impact use cases and optimize data usage.
Lead modernization initiatives from legacy data warehouses to cloud-native and distributed architectures.
Enforce data quality and observability practices for mission-critical workloads.
Required Skills:
10+ years in data architecture, with strong grounding in modern data platforms and pipelines.
Deep knowledge of SQL/NoSQL, Spark, Delta Lake, Kafka, and ETL/ELT frameworks.
Hands-on experience with cloud data platforms (AWS, Azure, GCP).
Understanding of data privacy, security, lineage, and compliance (GDPR, HIPAA, etc.).
Experience implementing data mesh / data fabric concepts is a plus.
Expertise in writing and presenting technical solutions using tools such as Word, PowerPoint, Excel, Visio, etc.
High level of executive presence, with the ability to articulate solutions to CXO-level executives.
Preferred Qualifications:
Certifications in Snowflake, Databricks, or cloud-native data platforms.
Exposure to AI/ML data pipelines, MLOps, and real-time data applications.
Familiarity with data visualization and BI tools (Power BI, Tableau, Looker, etc.).
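To make the batch-and-streaming pipeline pattern named in this posting concrete, here is a minimal, purely illustrative PySpark Structured Streaming sketch (Kafka to Delta); the broker, topic, and paths are hypothetical, and it assumes the Spark-Kafka connector and Delta support are available, as they are on Databricks.

```python
# Illustrative only: a minimal streaming ingestion sketch (Kafka -> Delta).
# Broker, topic, and storage paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("stream-ingest-example").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
)

# Kafka delivers key/value as binary; cast the payload to string for parsing.
payload = events.select(col("value").cast("string").alias("raw_event"))

query = (
    payload.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/chk/events/")  # hypothetical checkpoint
    .start("/mnt/curated/events/")                     # hypothetical sink path
)
query.awaitTermination()
```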
Posted 2 weeks ago